Last data update: May 06, 2024. (Total: 46732 publications since 2009)
Records 1-14 (of 14 Records) |
Query Trace: Williamson JM [original query] |
---|
Sensitivity and Uncertainty Analysis for Two-Stream Capture-Recapture Methods in Disease Surveillance (preprint)
Zhang Y , Chen J , Ge L , Williamson JM , Waller LA , Lyles RH . medRxiv 2022 23 Capture-recapture methods are widely applied in estimating the number (N) of prevalent or cumulatively incident cases in disease surveillance. Here, we focus the bulk of our attention on the common case in which there are two data streams. We propose a sensitivity and uncertainty analysis framework grounded in multinomial distribution-based maximum likelihood, hinging on a key dependence parameter that is typically non-identifiable but is epidemiologically interpretable. Focusing on the epidemiologically meaningful parameter unlocks appealing data visualizations for sensitivity analysis and provides an intuitively accessible framework for uncertainty analysis designed to leverage the practicing epidemiologist's understanding of the implementation of the surveillance streams as the basis for assumptions driving estimation of N. By illustrating the proposed sensitivity analysis using publicly available HIV surveillance data, we emphasize both the need to admit the lack of information in the observed data and the appeal of incorporating expert opinion about the key dependence parameter. The proposed uncertainty analysis is an empirical Bayes-like approach designed to more realistically acknowledge variability in the estimated N associated with uncertainty in an expert's opinion about the non-identifiable parameter, together with the statistical uncertainty. We demonstrate how such an approach can also facilitate an appealing general interval estimation procedure to accompany capture-recapture methods. Simulation studies illustrate the reliable performance of the proposed approach for quantifying uncertainties in estimating N in various contexts. Finally, we demonstrate how the recommended paradigm has the potential to be directly extended for application to data from more than two surveillance streams. 
Copyright The copyright holder for this preprint is the author/funder, who has granted medRxiv a license to display the preprint in perpetuity. All rights reserved. No reuse allowed without permission. |
Sensitivity and uncertainty analysis for two-stream capture-recapture methods in disease surveillance
Zhang Y , Chen J , Ge L , Williamson JM , Waller LA , Lyles RH . Epidemiology 2023 34 (4) 601-610 Capture-recapture methods are widely applied in estimating the number (N) of prevalent or cumulatively incident cases in disease surveillance. Here, we focus the bulk of our attention on the common case in which there are two data streams. We propose a sensitivity and uncertainty analysis framework grounded in multinomial distribution-based maximum likelihood, hinging on a key dependence parameter that is typically non-identifiable but is epidemiologically interpretable. Focusing on the epidemiologically meaningful parameter unlocks appealing data visualizations for sensitivity analysis and provides an intuitively accessible framework for uncertainty analysis designed to leverage the practicing epidemiologist's understanding of the implementation of the surveillance streams as the basis for assumptions driving estimation of N. By illustrating the proposed sensitivity analysis using publicly available HIV surveillance data, we emphasize both the need to admit the lack of information in the observed data and the appeal of incorporating expert opinion about the key dependence parameter. The proposed uncertainty analysis is a simulation-based approach designed to more realistically acknowledge variability in the estimated N associated with uncertainty in an expert's opinion about the non-identifiable parameter, together with the statistical uncertainty. We demonstrate how such an approach can also facilitate an appealing general interval estimation procedure to accompany capture-recapture methods. Simulation studies illustrate the reliable performance of the proposed approach for quantifying uncertainties in estimating N in various contexts. Finally, we demonstrate how the recommended paradigm has the potential to be directly extended for application to data from more than two surveillance streams. |
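The dependence-parameter idea in the two records above can be made concrete with a few lines of code. The sketch below is illustrative, not taken from the paper: the counts are made up, and `psi` is one simple conditional-probability parameterization of stream dependence (the ratio of the probability of capture by stream 2 given capture versus non-capture by stream 1). An assumed `psi` maps the two-stream counts to an estimate of N, reducing to the classical Lincoln-Petersen estimator at `psi = 1`:

```python
def n_hat(n11, n10, n01, psi=1.0):
    """Estimate total cases N from two-stream capture-recapture counts.

    n11: cases caught by both streams
    n10: caught by stream 1 only; n01: caught by stream 2 only
    psi: assumed dependence ratio
         P(stream 2 | stream 1) / P(stream 2 | not stream 1).
    psi = 1 recovers the classical Lincoln-Petersen estimator
    (n11 + n10) * (n11 + n01) / n11.
    """
    return (n11 + n10) * (n11 + psi * n01) / n11

# Sensitivity analysis: sweep psi over an expert-elicited range
# (counts are hypothetical, chosen only for illustration).
counts = dict(n11=60, n10=140, n01=90)
for psi in (0.5, 1.0, 1.5, 2.0):
    print(f"psi={psi:.1f}  N_hat={n_hat(**counts, psi=psi):.0f}")
```

Sweeping `psi` this way reproduces the spirit of the proposed sensitivity analysis: positive dependence (`psi > 1`) inflates the overlap cell and pushes the adjusted estimate of N above the Lincoln-Petersen value.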
A censored quantile regression approach for relative survival analysis: Relative survival quantile regression
Williamson JM , Lin HM , Lyles RH . Biom J 2023 65 (5) e2200127 We propose a censored quantile regression model for the analysis of relative survival data. We create a hybrid data set consisting of the study observations and counterpart randomly sampled pseudopopulation observations imputed from population life tables that adjust for expected mortality. We then fit a censored quantile regression model to the hybrid data incorporating demographic variables (e.g., age, biologic sex, calendar time) corresponding to the population life tables of demographically-similar individuals, a population versus study covariate, and its interactions with the variables of interest. These latter variables can be interpreted as relative survival parameters that depict the differences in failure quantiles between the study participants and their population counterparts. |
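The hybrid-data construction described above can be sketched as follows. This is a minimal illustration under stated assumptions: the life-table sampler is a made-up stand-in for a real life-table lookup, and fitting the censored quantile regression itself requires specialized software (not shown):

```python
import random

def build_hybrid(study, sample_population_time, n_per_subject=1, seed=1):
    """Stack study records with matched pseudo-population records.

    study: list of dicts with keys 'time', 'event', 'age', 'sex'
    sample_population_time: draws an expected survival time for a
        demographically similar individual (stand-in for a real
        life-table lookup).
    Returns records carrying a 0/1 'population' indicator; its
    interactions with covariates of interest play the role of the
    relative survival parameters described in the abstract.
    """
    rng = random.Random(seed)
    hybrid = []
    for rec in study:
        hybrid.append({**rec, "population": 0})
        for _ in range(n_per_subject):
            t = sample_population_time(rec["age"], rec["sex"], rng)
            hybrid.append({"time": t, "event": 1, "age": rec["age"],
                           "sex": rec["sex"], "population": 1})
    return hybrid

def toy_sampler(age, sex, rng):
    # toy life table: exponential remaining lifetime with mean 85 - age
    return rng.expovariate(1.0 / max(1.0, 85 - age))

study = [{"time": 4.2, "event": 1, "age": 60, "sex": "F"},
         {"time": 7.5, "event": 0, "age": 72, "sex": "M"}]
hybrid = build_hybrid(study, toy_sampler)
```

The key design point is that each study subject contributes both its own (possibly censored) record and one or more pseudo-population counterparts, so a single model fit on the stacked data can contrast the two.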
Firth adjustment for Weibull current-status survival analysis
Lin HM , Williamson JM , Kim HY . Commun Stat Theory Methods 2019 49 (18) 4587-4602 Analysis of right-censored data is problematic due to infinite maximum likelihood estimates (MLE) and potentially biased estimates, especially for small numbers of events. Analyzing current-status data is especially troublesome because of the extreme loss of precision due to large failure intervals. We extend Firth’s method for regular parametric problems to current-status modeling with the Weibull distribution. Firth advocated a bias reduction method for MLE by systematically correcting the score equation. An advantage is that it is still applicable when the MLE does not exist. We present simulation studies and two illustrative analyses involving RFM mice lung tumor data. |
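As a minimal sketch of Firth's penalization in the current-status setting, the code below fixes the Weibull shape at 1 (an exponential model, for brevity) and maximizes the penalized log-likelihood l*(lam) = l(lam) + 0.5*log I(lam) over a grid, where I is the expected Fisher information. The data and grid bounds are illustrative; the point it demonstrates, matching the abstract's claim, is that the penalized estimate stays finite even when every subject has failed by its inspection time, a case where the ordinary MLE diverges:

```python
import math

def firth_rate_estimate(data, grid=None):
    """Firth-type estimate of an exponential rate from current-status data.

    data: list of (C, delta) pairs; delta = 1 if failure occurred by
    inspection time C.  Maximizes l*(lam) = l(lam) + 0.5 * log I(lam)
    over a grid (bounded to keep terms in floating-point range).
    """
    if grid is None:
        grid = [math.exp(x / 100) for x in range(-500, 300)]  # lam in (e^-5, e^3)

    def penalized(lam):
        ll, info = 0.0, 0.0
        for c, d in data:
            f = 1.0 - math.exp(-lam * c)      # P(failure by c)
            ll += math.log(f) if d else -lam * c
            info += c * c * (1.0 - f) / f     # expected information term
        return ll + 0.5 * math.log(info)

    return max(grid, key=penalized)

# Every subject has already failed at inspection: the ordinary MLE of the
# rate is infinite, but the Firth-penalized estimate is finite.
data = [(1.0, 1), (2.0, 1), (3.0, 1), (4.0, 1)]
lam_hat = firth_rate_estimate(data)
```

A grid search stands in for the score-equation solution here purely to keep the sketch dependency-free; the full two-parameter Weibull version requires numerical optimization of the penalized likelihood.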
Relative contribution of schistosomiasis and malaria to anemia in Western Kenya
Valice EM , Wiegand RE , Mwinzi PNM , Karanja DMS , Williamson JM , Ochola E , Samuels A , Verani JR , Leon JS , Secor WE , Montgomery SP . Am J Trop Med Hyg 2018 99 (3) 713-715 Because anemia is one of the markers of morbidity associated with schistosomiasis, it has been proposed as a potential measure to evaluate the impact of control programs. However, anemia is also a common consequence of malaria, and schistosomiasis and malaria are often co-endemic. To estimate the attributable fraction of anemia due to Schistosoma mansoni and Plasmodium falciparum infections, we applied a log-binomial model to four studies measuring these parameters of a combined 5,849 children in western Kenya. In our studies, malaria contributed 23.3%, schistosomiasis contributed 6.6%, and co-infection contributed 27.6% of the anemia. We conclude that in areas where S. mansoni and P. falciparum are co-endemic, the contribution of schistosomiasis to anemia is masked by anemia resulting from malaria, thus limiting anemia as a useful measure for schistosomiasis control programs in these settings. |
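For intuition about attributable fractions like those reported above, the sketch below uses Levin's classical population attributable fraction formula with made-up numbers; it is a simpler stand-in for, not a reproduction of, the log-binomial model the study fits:

```python
def population_attributable_fraction(pr, p_exposed):
    """Levin's formula: PAF = p*(PR - 1) / (1 + p*(PR - 1)),
    where PR is the prevalence (or risk) ratio for anemia given
    infection and p is the exposure prevalence.  Inputs below are
    illustrative, not the Kenyan study's estimates."""
    excess = p_exposed * (pr - 1.0)
    return excess / (1.0 + excess)

paf = population_attributable_fraction(pr=2.0, p_exposed=0.5)  # about 0.333
```

A log-binomial regression supplies an adjusted prevalence ratio directly on the ratio scale, which is why it pairs naturally with attributable-fraction calculations of this kind.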
Anaemia in HIV-infected pregnant women receiving triple antiretroviral combination therapy for prevention of mother-to-child transmission: a secondary analysis of the Kisumu breastfeeding study (KiBS)
Odhiambo C , Zeh C , Angira F , Opollo V , Akinyi B , Masaba R , Williamson JM , Otieno J , Mills LA , Lecher SL , Thomas TK . Trop Med Int Health 2016 21 (3) 373-84 OBJECTIVE: The prevalence of anaemia during pregnancy is estimated to be 35-75% in sub-Saharan Africa and is associated with an increased risk of maternal mortality. We evaluated the frequency and factors associated with anaemia in HIV-infected women undergoing antiretroviral (ARV) therapy for prevention of mother-to-child transmission (PMTCT) enrolled in The Kisumu Breastfeeding Study 2003-2009. METHODS: Maternal haematological parameters were monitored from 32 to 34 weeks of gestation to 2 years post-delivery among 522 enrolled women. Clinical and laboratory assessments for causes of anaemia were performed, and appropriate management was initiated. Anaemia was graded using the National Institutes of Health Division of AIDS 1994 Adult Toxicity Tables. Data were analysed using SAS software, v 9.2. The Wilcoxon two-sample rank test was used to compare groups. A logistic regression model was fitted to describe the trend in anaemia over time. RESULTS: At enrolment, the prevalence of any grade anaemia (Hb < 9.4 g/dl) was 61.8%, but fell during ARV therapy, reaching a nadir (7.4%) by 6 months post-partum. A total of 41 women (8%) developed severe anaemia (Hb < 7 g/dl) during follow-up; 2 (4.9%) were hospitalised for blood transfusion, whereas 3 (7.3%) were transfused while hospitalised (for delivery). The greatest proportion of severe anaemia events occurred around delivery (48.8%; n = 20). Anaemia (Hb ≥ 7 and < 9.4 g/dl) at enrolment was associated with severe anaemia at delivery (OR 5.87; 95% CI: 4.48, 7.68, P < 0.01). Few cases of severe anaemia coincided with clinical malaria (24.4%; n = 10) and helminth (7.3%; n = 3) infections. CONCLUSION: Resolution of anaemia among most participants during study follow-up was likely related to receipt of ARV therapy. 
Efforts should be geared towards addressing common causes of anaemia in HIV-infected pregnant women, prioritising initiation of ARV therapy and management of peripartum blood loss. |
Power and sample size calculations for interval-censored survival analysis
Kim HY , Williamson JM , Lin HM . Stat Med 2015 35 (8) 1390-400 We propose a method for calculating power and sample size for studies involving interval-censored failure time data that only involves standard software required for fitting the appropriate parametric survival model. We use the framework of a longitudinal study where patients are assessed periodically for a response and the only resultant information available to the investigators is the failure window: the time between the last negative and first positive test results. The survival model is fit to an expanded data set using easily computed weights. We illustrate with a Weibull survival model and a two-group comparison. The investigator can specify a group difference in terms of a hazards ratio. Our simulation results demonstrate the merits of these proposed power calculations. We also explore how the number of assessments (visits), and thus the corresponding lengths of the failure intervals, affect study power. The proposed method can be easily extended to more complex study designs and a variety of survival and censoring distributions. |
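As a rough, standard-library-only point of comparison for such power calculations, the classic Schoenfeld approximation gives the number of events needed for a two-group hazard-ratio comparison. It assumes right censoring rather than interval censoring, so it serves as a baseline sanity check rather than the expanded-data method the abstract describes:

```python
from math import ceil, log
from statistics import NormalDist

def schoenfeld_events(hr, alpha=0.05, power=0.80, p1=0.5):
    """Approximate number of events needed to detect hazard ratio `hr`
    between two groups (allocation fraction p1) with a two-sided
    level-`alpha` log-rank-type test: the classic Schoenfeld formula
        d = (z_{1-alpha/2} + z_{power})**2 / (p1*(1-p1)*log(hr)**2)."""
    z = NormalDist()
    za = z.inv_cdf(1 - alpha / 2)
    zb = z.inv_cdf(power)
    return ceil((za + zb) ** 2 / (p1 * (1 - p1) * log(hr) ** 2))

events = schoenfeld_events(hr=2.0)  # about 66 events
```

Interval censoring generally costs additional precision relative to this right-censored baseline, which is exactly the effect of assessment frequency that the paper explores.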
A simple approach for sample size calculation for comparing two concordance correlation coefficients estimated on the same subjects
Lin HM , Williamson JM . J Biopharm Stat 2015 25 (6) 1145-60 Some studies are designed to assess the agreement between different raters and/or different instruments in the medical sciences and pharmaceutical research. In practice, the same sample will be used to compare the agreement of two or more assessment methods for simplicity and to take advantage of the positive correlation of the ratings. The concordance correlation coefficient (CCC) is often used as a measure of agreement when the rating is a continuous variable. We present an approach for calculating the sample size required for testing the equality of two CCCs, H0: CCC1 = CCC2 vs. HA: CCC1 not equal CCC2, where two assessment methods are used on the same sample, with two raters resulting in correlated CCC estimates. Our approach is to simulate one large "exemplary" dataset based on the specification of the joint distribution of the pairwise ratings for the two methods. We then create two new random variables from the simulated data that have the same variance-covariance matrix as the two dependent CCC estimates using the Taylor series linearization method. The method requires minimal computing time and can be easily extended to comparing more than two CCCs, or Kappa statistics. |
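Lin's CCC itself is straightforward to compute; the sketch below implements the standard formula for paired ratings (the sample-size machinery the paper builds on top of it is not shown):

```python
from statistics import fmean

def ccc(x, y):
    """Lin's concordance correlation coefficient for paired ratings:
        CCC = 2*s_xy / (s_x^2 + s_y^2 + (mean_x - mean_y)^2)
    using 1/n (biased) variances and covariance, as in Lin (1989)."""
    n = len(x)
    mx, my = fmean(x), fmean(y)
    sxx = sum((a - mx) ** 2 for a in x) / n
    syy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * sxy / (sxx + syy + (mx - my) ** 2)

# Perfect agreement gives CCC = 1; a constant shift between raters
# lowers CCC even when the Pearson correlation remains 1.
print(ccc([1, 2, 3, 4], [1, 2, 3, 4]))
print(ccc([1, 2, 3], [2, 3, 4]))
```

Unlike the Pearson correlation, the location-shift term (mean_x - mean_y)^2 in the denominator penalizes systematic disagreement between raters, which is what makes the CCC an agreement measure.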
Emergence of community-acquired, multidrug-resistant invasive nontyphoidal Salmonella disease in rural western Kenya, 2009-2013
Oneko M , Kariuki S , Muturi-Kioi V , Otieno K , Otieno VO , Williamson JM , Folster J , Parsons MB , Slutsker L , Mahon BE , Hamel MJ . Clin Infect Dis 2015 61 Suppl 4 S310-6 BACKGROUND: Nontyphoidal Salmonella (NTS), mainly serotypes Typhimurium and Enteritidis, cause invasive infections with high mortality in children in sub-Saharan Africa. Multidrug resistance is common, and resistance to third-generation cephalosporins has emerged. METHODS: We reviewed clinical features, outcomes, and antimicrobial resistance patterns in invasive NTS infections among children aged 6 weeks to 5 years participating in malaria vaccine studies in an area of high malaria and human immunodeficiency virus (HIV) transmission in Siaya, western Kenya. Blood culture was performed in hospitalized children and pediatric outpatients with prolonged fever. RESULTS: From July 2009 to December 2013, 1696 children aged 6 weeks to 17 months were enrolled into vaccine trials and followed for up to 53 months. We obtained 1692 blood cultures from 847 children. Of 134 bacterial pathogens isolated, 102 (76.1%) were Salmonella serogroup B or D. Invasive NTS disease occurred in 94 (5.5%) children, with an incidence of 1870, 4134, and 6510 episodes per 100 000 person-years overall, in infants, and in HIV-infected children, respectively. Malaria infection within the past 2 weeks occurred in 18.8% (3/16) of invasive NTS episodes in HIV-infected and 66.2% (53/80) in HIV-uninfected children. Case fatality rate was 3.1%. Salmonella group B resistant to ceftriaxone emerged in 2009 and 2010 (6.2% [2/32 isolates]), rising to 56.5% (13/23 isolates) in 2012 and 2013. CONCLUSIONS: Incidence of invasive NTS disease was high in this area of high malaria and HIV transmission, especially in HIV-infected children. Rapidly emerging resistance against ceftriaxone requires urgent reevaluation of antibiotic recommendations and primary prevention of exposure to Salmonella. |
Increased rates of respiratory and diarrheal illnesses in HIV-negative people living with HIV-infected individuals in a densely populated urban slum
Wong JM , Cosmas L , Nyachieo D , Williamson JM , Olack B , Okoth G , Njuguna H , Feikin DR , Burke H , Montgomery JM , Breiman RF . J Infect Dis 2015 212 (5) 745-53 BACKGROUND: Prolonged pathogen shedding and increased duration of illness associated with infections in immunosuppressed individuals put close HIV-negative contacts of HIV-infected people at increased risk of exposure to infectious pathogens. METHODS: We calculated incidence and longitudinal prevalence (number of days per year) of influenza-like illness (ILI), diarrhea, and non-specific febrile illness during 2008 from a population-based surveillance program in the urban slum of Kibera (Kenya) consisting of 1830 HIV-negative household contacts of HIV-infected individuals and 13 677 individuals living in exclusively HIV-negative households. RESULTS: For individuals ≥5 years old, incidence was significantly increased for ILI (IRR, 1.47; P < .05) and diarrhea (IRR, 1.41; P < .05) in HIV-negative household contacts of HIV-infected individuals compared to exclusively HIV-negative households. The risk of illness among HIV-negative people was directly proportional to the number of HIV-infected people living in the home for ILI (IRR, 1.39; P < .05) and diarrhea (IRR, 1.36; P < .01). We found no increased rates of illness in children <5 years old who lived with HIV-infected individuals. CONCLUSIONS: Living with HIV-infected individuals is associated with modestly increased rates of respiratory and diarrheal infections in HIV-negative individuals >5 years old. Targeted interventions are needed, including ensuring that HIV-infected people are receiving appropriate care and treatment. |
Sustained high incidence of injuries from burns in a densely populated urban slum in Kenya: an emerging public health priority
Wong JM , Nyachieo DO , Benzakri NA , Cosmas L , Ondari D , Yekta S , Montgomery JM , Williamson JM , Breiman RF . Burns 2014 40 (6) 1194-200 INTRODUCTION: Ninety-five percent of burn deaths occur in low- and middle-income countries (LMICs); however, longitudinal household-level studies have not been done in urban slum settings, where overcrowding and unsafe cook stoves may increase likelihood of injury. METHODS: Using a prospective, population-based disease surveillance system in the urban slum of Kibera in Kenya, we examined the incidence of household-level burns of all severities from 2006-2011. RESULTS: Of approximately 28,500 enrolled individuals (6000 households), we identified 3072 burns. The overall incidence was 27.9/1000 person-years-of-observation. Children <5 years old sustained burns at a 3.8-fold greater rate (p<0.001) than those ≥5 years old. Females ≥5 years old sustained burns at a 1.35-fold greater rate (p<0.001) than males in the same age group. Hospitalizations were uncommon (0.65% of all burns). CONCLUSIONS: The incidence of burns, 10-fold greater than in most published reports from Africa and Asia, suggests that such injuries may contribute more significantly than previously thought to morbidity in LMICs, and may be increased by urbanization. As migration from rural areas into urban slums rapidly increases in many African countries, characterizing and addressing the rising burden of burns is likely to become a public health priority. |
A longitudinal analysis of the effect of mass drug administration on acute inflammatory episodes and disease progression in lymphedema patients in Leogane, Haiti
Eddy BA , Blackstock AJ , Williamson JM , Addiss DG , Streit TG , Beau de Rochars VM , Fox LM . Am J Trop Med Hyg 2014 90 (1) 80-8 We conducted a longitudinal analysis of 117 lymphedema patients in a filariasis-endemic area of Haiti during 1995-2008. No difference in lymphedema progression between those who received or did not receive mass drug administration (MDA) was found on measures of foot (P = 0.24), ankle (P = 0.87), or leg (P = 0.46) circumference; leg volume displacement (P = 0.09), lymphedema stage (P = 0.93), or frequency of adenolymphangitis (ADL) episodes (P = 0.57). Rates of ADL per year were greater after initiation of MDA among both groups (P < 0.01). Nevertheless, patients who received MDA reported improvement in four areas of lymphedema-related quality of life (P ≤ 0.01). Decreases in foot and ankle circumference and ADL episodes were observed during the 1995-1998 lymphedema management study (P ≤ 0.01). This study represents the first longitudinal, quantitative, leg-specific analysis examining the clinical effect of diethylcarbamazine on lymphedema progression and ADL episodes. |
A reversal in reductions of child mortality in western Kenya, 2003-2009
Hamel MJ , Adazu K , Obor D , Sewe M , Vulule J , Williamson JM , Slutsker L , Feikin DR , Laserson KF . Am J Trop Med Hyg 2011 85 (4) 597-605 We report and explore changes in child mortality in a rural area of Kenya during 2003-2009, when major public health interventions were scaled-up. Mortality ratios and rates were calculated by using the Kenya Medical Research Institute/Centers for Disease Control and Prevention Demographic Surveillance System. Inpatient and outpatient morbidity and mortality, and verbal autopsy data were analyzed. Mortality ratios for children less than five years of age decreased from 241 deaths/1,000 live-births in 2003 to 137 in 2007, then rose to 212 deaths/1,000 live-births in 2008. Mortality remained elevated during the first 8 months of 2009 compared with 2006 and 2007. Malaria and/or anemia accounted for the greatest increases in child mortality. Stock-outs of essential antimalarial drugs during a time of increased malaria transmission and disruption of services during civil unrest may have contributed to increased mortality in 2008-2009. To maintain gains in child survival, implementation of good policies and effective interventions must be complemented by reliable supply and access to clinical services and essential drugs. |
Viral shedding in patients infected with pandemic influenza A (H1N1) virus in Kenya, 2009
Waiboci LW , Lebo E , Williamson JM , Mwiti W , Kikwai GK , Njuguna H , Olack B , Breiman RF , Njenga MK , Katz MA . PLoS One 2011 6 (6) e20320 BACKGROUND: Understanding shedding patterns of 2009 pandemic influenza A (H1N1) (pH1N1) can inform recommendations about infection control measures. We evaluated the duration of pH1N1 virus shedding in patients in Nairobi, Kenya. METHODS: Nasopharyngeal (NP) and oropharyngeal (OP) specimens were collected from consenting laboratory-confirmed pH1N1 cases every 2 days during October 14-November 25, 2009, and tested at the Centers for Diseases Control and Prevention-Kenya by real time reverse transcriptase polymerase chain reaction (rRT-PCR). A subset of rRT-PCR-positive samples was cultured. RESULTS: Of 285 NP/OP specimens from patients with acute respiratory illness, 140 (49%) tested positive for pH1N1 by rRT-PCR; 106 (76%) patients consented and were enrolled. The median age was 6 years (Range: 4 months-41 years); only two patients, both asthmatic, received oseltamivir. The median duration of pH1N1 detection after illness onset was 8 days (95% CI: 7-10 days) for rRT-PCR and 3 days (Range: 0-13 days) for viral isolation. Viable pH1N1 virus was isolated from 132/162 (81%) of rRT-PCR-positive specimens, which included 118/125 (94%) rRT-PCR-positive specimens collected on day 0-7 after symptoms onset. Viral RNA was detectable in 18 (17%) and virus isolated in 7/18 (39%) of specimens collected from patients after all their symptoms had resolved. CONCLUSIONS: In this cohort, pH1N1 was detected by rRT-PCR for a median of 8 days. There was a strong correlation between rRT-PCR results and virus isolation in the first week of illness. In some patients, pH1N1 virus was detectable after all their symptoms had resolved. |